Finite State Automata that Recurrent Cascade-Correlation Cannot Represent
Author
Abstract
This paper relates the computational power of Fahlman's Recurrent Cascade-Correlation (RCC) architecture to that of finite state automata (FSA). While some recurrent networks are FSA equivalent, RCC is not. The paper presents a theoretical analysis of the RCC architecture in the form of a proof describing a large class of FSA that cannot be realized by RCC.
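As an illustration of the flavor of automaton at issue, the minimal sketch below simulates a three-state cycle driven by a single repeated input symbol, so the state oscillates with a period greater than two under constant input; the specific automaton, transition table, and run() helper are assumptions for illustration and are not taken from the paper.

# Hypothetical example: a mod-3 cycle over a one-symbol alphabet.
# Under the constant input 'aaa...' the state sequence is 0, 1, 2, 0, 1, 2, ...
TRANSITIONS = {  # (state, input symbol) -> next state
    (0, 'a'): 1,
    (1, 'a'): 2,
    (2, 'a'): 0,
}

def run(inputs, start=0):
    """Return the sequence of states visited while reading `inputs`."""
    state = start
    visited = [state]
    for sym in inputs:
        state = TRANSITIONS[(state, sym)]
        visited.append(state)
    return visited

if __name__ == "__main__":
    # Constant input of six 'a' symbols: the state keeps cycling with period 3.
    print(run("aaaaaa"))  # [0, 1, 2, 0, 1, 2, 0]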
Similar resources
Constructive learning of recurrent neural networks: limitations of recurrent cascade correlation and a simple solution
It is often difficult to predict the optimal neural network size for a particular application. Constructive or destructive methods that add or subtract neurons, layers, connections, etc. might offer a solution to this problem. We prove that one method, recurrent cascade correlation, due to its topology, has fundamental limitations in representation and thus in its learning capabilities. It cann...
RCC Cannot Compute Certain FSA, Even with Arbitrary Transfer Functions
Existing proofs demonstrating the computational limitations of Recurrent Cascade Correlation and similar networks (Fahlman, 1991; Bachrach, 1988; Mozer, 1988) explicitly limit their results to units having sigmoidal or hard-threshold transfer functions (Giles et al., 1995; Kremer, 1996). The proof given here shows that for any finite, discrete transfer function used by the units of an RCC n...
Fuzzy Automata Induction using Construction Method
Recurrent neural networks have recently been demonstrated to have the ability to learn simple grammars. In particular, networks using second-order units have been successful at this task. However, it is often difficult to predict the optimal neural network size to induce an unknown automaton from examples. Instead of just adjusting the weights in a network of fixed topology, we adopt the dyna...
Evolutionary Training of Hybrid Systems of Recurrent Neural Networks and Hidden Markov Models
We present a hybrid architecture of recurrent neural networks (RNNs) inspired by hidden Markov models (HMMs). We train the hybrid architecture using genetic algorithms to learn and represent dynamical systems. We train the hybrid architecture on a set of deterministic finite-state automata strings and observe the generalization performance of the hybrid architecture when presented with a new se...
Reduction of Computational Complexity in Finite State Automata Explosion of Networked System Diagnosis (RESEARCH NOTE)
This research puts forward rough finite state automata which have been represented by two variants of BDD called ROBDD and ZBDD. The proposed structures have been used in networked system diagnosis and can overcome combinatorial explosion. In implementation, the CUDD (Colorado University Decision Diagrams) package is used. A mathematical proof for the claimed complexity is provided, which shows ZBDD ...
Journal title:
Volume Issue
Pages -
Publication date 1995